# Text corruption task
## Ro Bart 1024
A BART base model (~140 million parameters) pre-trained from scratch on a 50 GB Romanian text corpus. A usage sketch follows the listing below.
Tags: Large Language Model · Transformers · Other

Author: Iulian277
## Ro Bart 512
A BART base model pre-trained from scratch, designed specifically for Romanian text processing tasks.
Tags: Large Language Model · Transformers · Other

Author: Iulian277
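
Since both checkpoints are pre-trained only on BART's denoising (text corruption) objective, they are normally fine-tuned for downstream tasks, but they can be loaded directly for text infilling. The sketch below is a minimal, unverified example; the hub identifiers `Iulian277/ro-bart-1024` and `Iulian277/ro-bart-512` are assumptions pieced together from the author and model names in this listing.

```python
# Minimal sketch: load one of the Romanian BART checkpoints listed above with
# Hugging Face transformers and run the text-infilling (denoising) objective
# BART is pre-trained on. The hub id is an assumption based on the listing.
from transformers import AutoTokenizer, BartForConditionalGeneration

model_id = "Iulian277/ro-bart-1024"  # assumed id; "Iulian277/ro-bart-512" for the shorter-context model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# Corrupt a Romanian sentence with the mask token and let the model reconstruct it.
text = f"Capitala României este {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, num_beams=4, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Output quality will be limited without task-specific fine-tuning; the example only illustrates how the pre-training objective can be exercised directly.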